A touchscreen is an electronic visual display that can detect the presence and location of a touch within the display area. The term generally refers to touching the display of the device with a finger or hand. Touchscreens can also sense other passive objects, such as a stylus. Touchscreens are common in devices such as all-in-one computers, tablet computers, and smartphones.
The touchscreen has two main attributes. First, it enables one to interact directly with what is displayed, rather than indirectly with a pointer controlled by a mouse or touchpad. Second, it lets one do so without any intermediate device that would need to be held in the hand (other than a stylus, which is optional for most modern touchscreens). Such displays can be attached to computers, or to networks as terminals. They also play a prominent role in the design of digital appliances such as personal digital assistants (PDAs), satellite navigation devices, mobile phones, and video game consoles.
The first touch screen was a capacitive touch screen developed by E.A. Johnson at the Royal Radar Establishment, Malvern, UK. The inventor briefly described his work in a short article published in 1965[5] and then more fully, with photographs and diagrams, in an article published in 1967.[6] The applicability of touch technology to air traffic control was described in an article published in 1968.[7]
Contrary to many accounts,[8] Dr. Sam Hurst, although he played an important role in the development of touch technologies, invented neither the first touch sensor nor the first touch screen.
Touchscreens have subsequently become familiar in everyday life. Companies use touchscreens for kiosk systems in retail and tourist settings, point of sale systems, ATMs, and PDAs, where a stylus is sometimes used to manipulate the GUI and to enter data.
From 1979 to 1985, the Fairlight CMI (and Fairlight CMI IIx) was a high-end musical sampling and re-synthesis workstation that used light pen technology, with which the user could allocate and manipulate sample and synthesis data, as well as access different menus within its OS, by touching the screen with the light pen. The later Fairlight series IIT models used a graphics tablet in place of the light pen.
The HP-150 from 1983 was one of the world's earliest commercial touchscreen computers. Similar to the PLATO IV system, its touch technology employed infrared transmitters and receivers mounted around the bezel of its 9" Sony cathode ray tube (CRT), which detected the position of any non-transparent object on the screen.
An early attempt at a handheld game console with touchscreen controls was Sega's intended successor to the Game Gear, though the device was ultimately shelved and never released because of the high cost of touchscreen technology in the early 1990s. Touchscreens would not be popularly used for video games until the release of the Nintendo DS in 2004.[9]
Until recently, most consumer touchscreens could only sense one point of contact at a time, and few could sense how hard the user was pressing. This is starting to change with the commercialization of multi-touch technology.
The popularity of smartphones, tablet computers, portable video game consoles, and many types of information appliances is driving the demand for, and acceptance of, touchscreens in portable and functional electronics. Because the display is a simple, smooth surface and the user interacts directly with the content, without intermediate hardware such as a keyboard or mouse, fewer accessories are required.
Touchscreens are popular in the hospitality field and in heavy industry, as well as in kiosks such as museum displays and in room automation, where keyboard and mouse systems do not allow suitably intuitive, rapid, or accurate interaction with the display's content.
Historically, the touchscreen sensor and its accompanying controller-based firmware have been made available by a wide array of after-market system integrators, and not by display, chip, or motherboard manufacturers. Display manufacturers and chip manufacturers worldwide have acknowledged the trend toward acceptance of touchscreens as a highly desirable user interface component and have begun to integrate touchscreens into the fundamental design of their products.
There are a variety of touchscreen technologies:
A resistive touchscreen panel comprises several layers, the most important of which are two thin, transparent, electrically resistive layers facing each other across a thin gap. One resistive layer is a coating on the underside of the top surface of the screen; just beneath it is a similar resistive layer on top of its substrate. One layer has conductive connections along its left and right sides, the other along its top and bottom.
When an object such as a fingertip or stylus tip presses down on the outer surface, the two layers touch and become connected at that point. The panel then behaves as a pair of voltage dividers, one axis at a time. For a short time, the associated electronics (device controller) applies a voltage across opposite sides of one layer while the other layer senses the proportion of that voltage present at the contact point; this gives the horizontal (x) position. The controller then applies a voltage across the top and bottom edges of the other layer (the one that just sensed the voltage), and the first layer senses the vertical (y) position. The controller rapidly alternates between these two modes and sends the position data to the device's CPU, where it is interpreted according to what the user is doing.
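As a rough illustration of the voltage-divider readout described above, the following Python sketch converts raw controller readings into screen coordinates; the function name, the 10-bit converter range, and the example panel size are hypothetical assumptions, not details of any particular controller.

```python
# Illustrative sketch of 4-wire resistive position decoding.
# ADC_MAX assumes a hypothetical 10-bit analog-to-digital converter.
ADC_MAX = 1023

def read_position(adc_x, adc_y, width_px, height_px):
    """Convert voltage-divider readings into screen coordinates.

    adc_x: voltage sensed by the second layer while the first is driven
    adc_y: voltage sensed by the first layer while the second is driven
    """
    x = adc_x / ADC_MAX * width_px   # proportion of full-scale voltage -> x
    y = adc_y / ADC_MAX * height_px  # proportion of full-scale voltage -> y
    return x, y

# Readings near mid-scale map to roughly the centre of an 800x480 panel.
print(read_position(512, 512, 800, 480))
```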
Resistive touch is used in restaurants, factories and hospitals due to its high resistance to liquids and contaminants. A major benefit of resistive touch technology is its low cost. Disadvantages include the need to press down, and a risk of damage by sharp objects.
Surface acoustic wave (SAW) technology uses ultrasonic waves that pass over the touchscreen panel. When the panel is touched, a portion of the wave is absorbed. This change in the ultrasonic waves registers the position of the touch event and sends this information to the controller for processing. Surface wave touchscreen panels can be damaged by outside elements. Contaminants on the surface can also interfere with the functionality of the touchscreen.[10]
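A very simplified way to picture how the absorbed portion of the wave is turned into a position is the sketch below; the stored reference envelope, the sample spacing, and the scaling are assumptions made purely for illustration and do not reflect any specific SAW controller.

```python
# Simplified sketch: compare the received ultrasonic envelope with a stored
# reference and map the strongest attenuation dip to a position on one axis.
def locate_dip(reference, received, mm_per_sample=0.5):
    """Return the position (in mm) of the largest attenuation along one axis."""
    drops = [ref - rec for ref, rec in zip(reference, received)]
    dip_index = max(range(len(drops)), key=lambda i: drops[i])
    return dip_index * mm_per_sample

reference = [1.0] * 10                      # undisturbed wave envelope
received  = [1.0] * 4 + [0.4] + [1.0] * 5   # a touch absorbs energy at sample 4
print(locate_dip(reference, received))      # -> 2.0 (mm along this axis)
```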
A capacitive touchscreen panel consists of an insulator, such as glass, coated with a transparent conductor such as indium tin oxide (ITO).[11][12] As the human body is also an electrical conductor, touching the surface of the screen results in a distortion of the screen's electrostatic field, measurable as a change in capacitance. Different technologies may be used to determine the location of the touch; the location is then sent to the controller for processing. Unlike a resistive touchscreen, a capacitive touchscreen cannot be used through most types of electrically insulating material, such as gloves; instead, one requires a special capacitive stylus, or a special-application glove with an embroidered patch of conductive thread that passes through it and contacts the user's fingertip. This disadvantage especially affects usability in consumer electronics, such as touch tablet PCs and capacitive smartphones, in cold weather.
In the more basic surface capacitance technology, only one side of the insulator is coated with a conductive layer. A small voltage is applied to the layer, resulting in a uniform electrostatic field. When a conductor, such as a human finger, touches the uncoated surface, a capacitor is dynamically formed. The sensor's controller can determine the location of the touch indirectly from the change in capacitance as measured from the four corners of the panel. As it has no moving parts, it is moderately durable, but it has limited resolution, is prone to false signals from parasitic capacitive coupling, and needs calibration during manufacture. It is therefore most often used in simple applications such as industrial controls and kiosks.[13]
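The four-corner measurement can be pictured with the simplified sketch below, which assumes, purely for illustration, that the touch position is a linear function of the share of current drawn through each corner electrode; real controllers apply calibration and non-linearity correction.

```python
# Simplified surface-capacitance position estimate from four corner currents.
def estimate_position(i_tl, i_tr, i_bl, i_br, width_px, height_px):
    """i_* are illustrative corner current readings: top-left, top-right,
    bottom-left, bottom-right. Returns an (x, y) estimate in pixels."""
    total = i_tl + i_tr + i_bl + i_br
    x = (i_tr + i_br) / total * width_px   # more current on the right -> touch further right
    y = (i_bl + i_br) / total * height_px  # more current at the bottom -> touch lower down
    return x, y

# A touch near the top-left corner draws most of the current through that corner.
print(estimate_position(0.7, 0.1, 0.1, 0.1, 800, 480))  # -> (160.0, 96.0)
```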
Projected Capacitive Touch (PCT) technology is a capacitive technology which permits more accurate and flexible operation, by etching the conductive layer. An X-Y grid is formed either by etching a single layer to form a grid pattern of electrodes, or by etching two separate, perpendicular layers of conductive material with parallel lines or tracks to form the grid (comparable to the pixel grid found in many LCD displays). The conducting layers can be coated with further protective insulating layers, and operate even under screen protectors or behind weather- and vandal-proof glass. Because the top layer of a PCT panel is glass, PCT is a more robust solution than resistive touch technology. Depending on the implementation, an active or passive stylus can be used instead of, or in addition to, a finger; this is common with point of sale devices that require signature capture. Gloved fingers may or may not be sensed, depending on the implementation and gain settings. Conductive smudges and similar interference on the panel surface can degrade performance; such smudges come mostly from sticky or sweaty fingertips, especially in high-humidity environments, and collected dust that adheres to the screen because of fingertip moisture can also be a problem. There are two types of PCT: self capacitance and mutual capacitance.
In mutual capacitive sensors, there is a capacitor at every intersection of each row and each column. A 16-by-14 array, for example, would have 224 independent capacitors. A voltage is applied to the rows or columns. Bringing a finger or conductive stylus close to the surface of the sensor changes the local electrostatic field which reduces the mutual capacitance. The capacitance change at every individual point on the grid can be measured to accurately determine the touch location by measuring the voltage in the other axis. Mutual capacitance allows multi-touch operation where multiple fingers, palms or styli can be accurately tracked at the same time.
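The grid scan can be sketched as below; the baseline value, the threshold, and the measure_node callback are illustrative assumptions rather than the interface of any real touch controller.

```python
# Sketch of mutual-capacitance scanning on a 16-by-14 grid: each row/column
# intersection is measured, and a large drop from the baseline is a touch.
ROWS, COLS = 14, 16
BASELINE = 100    # nominal mutual capacitance per node (arbitrary units)
THRESHOLD = 20    # minimum capacitance drop treated as a touch

def scan(measure_node):
    """measure_node(row, col) returns the capacitance at one intersection."""
    touches = []
    for r in range(ROWS):
        for c in range(COLS):
            if BASELINE - measure_node(r, c) > THRESHOLD:
                touches.append((r, c))   # a nearby finger reduces mutual capacitance
    return touches

# Two simultaneous fingers register as two independent nodes (multi-touch).
panel = {(3, 5): 60, (10, 12): 55}
print(scan(lambda r, c: panel.get((r, c), BASELINE)))  # -> [(3, 5), (10, 12)]
```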
Self-capacitance sensors can have the same X-Y grid as mutual capacitance sensors, but the columns and rows operate independently. With self-capacitance, the capacitive load of a finger is measured on each column or row electrode by a current meter. This method produces a stronger signal than mutual capacitance, but it cannot accurately resolve more than one finger, which results in "ghosting", or misplaced location sensing.
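The ghosting limitation can be shown in a few lines: because rows and columns are read independently, two fingers produce two row peaks and two column peaks, which combine into four candidate positions. The coordinates below are arbitrary examples.

```python
# Why self-capacitance "ghosts": independent row and column readings cannot
# say which row peak belongs to which column peak.
def candidate_touches(active_rows, active_cols):
    return [(r, c) for r in active_rows for c in active_cols]

# Fingers at (2, 3) and (7, 9) activate rows {2, 7} and columns {3, 9} ...
print(candidate_touches([2, 7], [3, 9]))
# -> [(2, 3), (2, 9), (7, 3), (7, 9)]: the real pair is indistinguishable
#    from the "ghost" pair (2, 9) and (7, 3).
```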
An infrared touchscreen uses an array of X-Y infrared LED and photodetector pairs around the edges of the screen to detect a disruption in the pattern of LED beams. These LED beams cross each other in vertical and horizontal patterns, which helps the sensors pick up the exact location of the touch. A major benefit of such a system is that it can detect essentially any input, including a finger, gloved finger, stylus, or pen. It is generally used in outdoor applications and point of sale systems which cannot rely on a conductor (such as a bare finger) to activate the touchscreen. Unlike capacitive touchscreens, infrared touchscreens do not require any patterning on the glass, which increases the durability and optical clarity of the overall system.
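A minimal sketch of locating the interruption follows; the beam pitch and the lists of blocked beams are invented inputs used only to show the geometry.

```python
# Infrared-grid sketch: the touch lies where the interrupted vertical and
# horizontal LED beams cross; averaging blocked beams gives its centre.
def locate_touch(blocked_cols, blocked_rows, beam_pitch_mm=5.0):
    """blocked_cols/blocked_rows: indices of interrupted beams on each axis."""
    if not blocked_cols or not blocked_rows:
        return None                       # nothing (or only one axis) blocked
    x = sum(blocked_cols) / len(blocked_cols) * beam_pitch_mm
    y = sum(blocked_rows) / len(blocked_rows) * beam_pitch_mm
    return x, y

# A gloved finger blocking vertical beams 10-11 and horizontal beams 4-5:
print(locate_touch([10, 11], [4, 5]))  # -> (52.5, 22.5) in millimetres
```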
Optical imaging is a relatively modern development in touchscreen technology, in which two or more image sensors are placed around the edges (mostly the corners) of the screen. Infrared backlights are placed in the cameras' field of view on the other side of the screen. A touch shows up as a shadow, and each pair of cameras can then be used to pinpoint the location of the touch, or even to measure the size of the touching object (see visual hull). This technology is growing in popularity because of its scalability, versatility, and affordability, especially for larger units.
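One way to picture how a pair of corner cameras pinpoints a shadow is plain triangulation, as in the sketch below; the corner placement, the angle convention (measured from the top edge of the screen), and the example values are assumptions for illustration, and lens calibration is omitted.

```python
import math

# Triangulation sketch for camera-based (optical) touch: two cameras in the
# top corners each report the angle to the touch shadow, measured from the
# top edge; the touch is where the two sight lines intersect.
def triangulate(angle_left_deg, angle_right_deg, screen_width_mm):
    """Left camera at (0, 0), right camera at (screen_width_mm, 0)."""
    ta = math.tan(math.radians(angle_left_deg))
    tb = math.tan(math.radians(angle_right_deg))
    x = screen_width_mm * tb / (ta + tb)
    y = x * ta
    return x, y

# A shadow seen at 45 degrees by both cameras lies midway between them.
print(triangulate(45, 45, 400))  # -> (200.0, 200.0)
```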
Introduced in 2002 by 3M, dispersive signal technology uses sensors to detect the piezoelectricity in the glass that occurs due to a touch. Complex algorithms then interpret this information to provide the actual location of the touch.[14] The technology claims to be unaffected by dust and other outside elements, including scratches. Since there is no need for additional elements on the screen, it also claims to provide excellent optical clarity. Also, since mechanical vibrations are used to detect a touch event, any object can be used to generate these events, including fingers and styli. A downside is that after the initial touch, the system cannot detect a motionless finger.
Acoustic pulse recognition (APR), introduced by Tyco International's Elo division in 2006, relies on the fact that a touch at each position on the glass generates a unique sound. Four tiny transducers attached to the edges of the touchscreen glass pick up the sound of the touch. The sound is then digitized by the controller and compared to a list of prerecorded sounds for every position on the glass, and the cursor position is instantly updated to the touch location. APR is designed to ignore extraneous and ambient sounds, since they do not match any stored sound profile. APR differs from other attempts to recognize the position of touch with transducers or microphones in that it uses a simple table lookup method rather than requiring powerful and expensive signal processing hardware to calculate the touch location without any references.[15] The touchscreen itself is made of ordinary glass, giving it good durability and optical clarity. It is usually able to function with scratches and dust on the screen with good accuracy. The technology is also well suited to displays that are physically larger. As with the dispersive signal technology system, after the initial touch a motionless finger cannot be detected; however, for the same reason, touch recognition is not disrupted by resting objects.
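The table lookup idea can be caricatured in a few lines, as below; the stored profiles, the distance measure, and the sample values are all invented for illustration and are far simpler than a real APR controller's matching.

```python
# Toy version of acoustic pulse recognition's lookup: compare the digitized
# touch sound with prerecorded profiles and return the closest position.
def locate_by_sound(sample, profile_table):
    """profile_table maps (x, y) positions to prerecorded sound profiles."""
    def distance(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(profile_table, key=lambda pos: distance(sample, profile_table[pos]))

profiles = {(0, 0): [0.9, 0.1, 0.0], (100, 50): [0.2, 0.7, 0.4]}
print(locate_by_sound([0.25, 0.65, 0.35], profiles))  # -> (100, 50)
```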
There are several principal ways to build a touchscreen. The key goals are to recognize one or more fingers touching a display, to interpret the command that this represents, and to communicate the command to the appropriate application.
In the most popular techniques, the capacitive or resistive approach, there are typically four layers:
1. A top polyester-coated layer with a transparent metallic-conductive coating on the bottom.
2. An adhesive spacer.
3. A glass layer coated with a transparent metallic-conductive coating on the top.
4. An adhesive layer on the backside of the glass for mounting.
When a user touches the surface, the system records the change in the electrical current that flows through the display.
Dispersive-signal technology, which 3M created in 2002, measures the piezoelectric effect (the voltage generated when mechanical force is applied to a material) that occurs when a strengthened glass substrate is touched.
There are two infrared-based approaches. In one, an array of sensors detects a finger touching or almost touching the display, thereby interrupting light beams projected over the screen. In the other, bottom-mounted infrared cameras record screen touches.
In each case, the system determines the intended command based on the controls showing on the screen at the time and the location of the touch.
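A minimal sketch of that last step, mapping a touch location to the control shown under it, might look like the following; the control names and rectangles are hypothetical.

```python
# Hit-testing sketch: match reported touch coordinates against the controls
# currently on screen to decide which command was intended.
CONTROLS = [
    {"name": "OK",     "rect": (600, 400, 760, 460)},   # (x0, y0, x1, y1)
    {"name": "Cancel", "rect": (440, 400, 580, 460)},
]

def command_at(x, y, controls=CONTROLS):
    for ctrl in controls:
        x0, y0, x1, y1 = ctrl["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return ctrl["name"]
    return None   # the touch fell outside every control

print(command_at(620, 430))  # -> 'OK'
```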
Most touchscreen patents were filed during the 1970s and 1980s and have expired. Touchscreen component manufacturing and product design are no longer encumbered by royalties or legalities with regard to patents and the use of touchscreen-enabled displays is widespread.
The development of multipoint touchscreens facilitated the tracking of more than one finger on the screen; thus, operations that require more than one finger are possible. These devices also allow multiple users to interact with the touchscreen simultaneously.
With the growing use of touchscreens, the marginal cost of touchscreen technology is routinely absorbed into the products that incorporate it and is nearly eliminated. Touchscreens now have proven reliability. Thus, touchscreen displays are found today in airplanes, automobiles, gaming consoles, machine control systems, appliances, and handheld display devices including the Nintendo DS and the later multi-touch enabled iPhones; the touchscreen market for mobile devices is projected to produce US$5 billion in 2009.[16]
The ability to accurately point on the screen itself is also advancing with the emerging graphics tablet/screen hybrids.
In October 2011, TapSense was announced; it can distinguish between the different parts of the hand that touch the screen, such as the fingertip and the fingernail, so that different touch types can be mapped to different functions, for example entering lower-case versus capital letters.[17]
These ergonomic issues of direct touch (the pressure required, the limited precision, and the skin oil left on the screen) can be bypassed by using a different technique, provided that the user's fingernails are either short or sufficiently long. Rather than pressing with the soft skin of an outstretched fingertip, the finger is curled over, so that the tip of a fingernail can be used instead. This method does not work on capacitive touchscreens.
The fingernail's hard, curved surface contacts the touchscreen at one very small point. Therefore, much less finger pressure is needed, much greater precision is possible (approaching that of a stylus, with a little experience), much less skin oil is smeared onto the screen, and the fingernail can be silently moved across the screen with very little resistance, allowing for selecting text, moving windows, or drawing lines.
The human fingernail consists of keratin which has a hardness and smoothness similar to the tip of a stylus (and so will not typically scratch a touchscreen). Alternatively, very short stylus tips are available, which slip right onto the end of a finger; this increases visibility of the contact point with the screen.
Touchscreens can suffer from the problem of fingerprints on the display. This can be mitigated by using materials with optical coatings designed to reduce the visible effects of fingerprint oils; by using oleophobic coatings, as on the iPhone 3G S, which lessen the actual amount of oil residue; by installing a matte-finish anti-glare screen protector, which creates a slightly roughened surface that does not easily retain smudges; or by reducing skin contact altogether through the use of a fingernail or stylus.
Touchscreens are often used with haptic response systems; an example is a system that causes the device to vibrate when a button on the touchscreen is tapped. Touchscreens that lack tactile feedback or haptics can be difficult to use because of latency and other factors. Research from the University of Glasgow, Scotland (Brewster, Chohan, and Brown, 2007; and more recently Hogan) demonstrates that users reduce input errors by 20%, increase input speed by 20%, and lower their cognitive load by 40% when touchscreens are combined with haptic or tactile feedback, compared with non-haptic touchscreens.
The Jargon File dictionary of hacker slang defined "gorilla arm" as the failure to understand the ergonomics of vertically mounted touchscreens for prolonged use. The proposition is that the human arm held in an unsupported horizontal position rapidly becomes fatigued and painful, the so-called "gorilla arm".[18] It is often cited as a prima facie example of what not to do in ergonomics. Vertical touchscreens still dominate in applications such as ATMs and data kiosks in which the usage is too brief to be an ergonomic problem.
Discomfort might be caused by previous poor posture and atrophied muscular systems caused by limited physical exercise.[19] Fine art painters are also often subject to neck and shoulder pains due to their posture and the repetitiveness of their movements while painting.[20]
Some touchscreens, primarily those employed in smartphones, use transparent plastic protectors to prevent any scratches that might be caused by day-to-day use from becoming permanent.